Preventing oversmoothing in VAE via generalized variance parameterization

Authors

Abstract

Variational autoencoders (VAEs) often suffer from posterior collapse, a phenomenon in which the learned latent space becomes uninformative. This is often related to a hyperparameter resembling the data variance. It can be shown that an inappropriate choice of this hyperparameter causes oversmoothness in the linearly approximated case, and this can be empirically verified for general cases. Moreover, determining such an appropriate choice becomes infeasible if the data variance is non-uniform or conditional. Therefore, we propose VAE extensions with generalized variance parameterizations that incorporate maximum likelihood estimation into the objective function to adaptively regularize the smoothness of the decoder. The images generated by the proposed extensions show improved Fréchet inception distance (FID) on the MNIST and CelebA datasets.
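As a minimal illustration of the underlying idea (a sketch, not the paper's exact parameterization), the following shows a Gaussian decoder likelihood with a learnable shared variance. Its maximum likelihood estimate is simply the mean squared reconstruction error, which is what lets the objective adaptively balance reconstruction against regularization instead of relying on a hand-tuned variance hyperparameter. All names (`gaussian_nll`, `mle_log_var`) are illustrative.

```python
import numpy as np

def gaussian_nll(x, x_hat, log_var):
    """Per-dimension Gaussian negative log-likelihood with a shared variance."""
    var = np.exp(log_var)
    return 0.5 * np.mean((x - x_hat) ** 2 / var + log_var + np.log(2 * np.pi))

def mle_log_var(x, x_hat):
    """Closed-form MLE of the shared variance: the mean squared error."""
    return np.log(np.mean((x - x_hat) ** 2))

rng = np.random.default_rng(0)
x = rng.random((16, 784))                       # a toy batch of "images"
x_hat = x + 0.1 * rng.standard_normal(x.shape)  # imperfect reconstructions

log_var_star = mle_log_var(x, x_hat)
```

Plugging the MLE variance back into the NLL gives 0.5·(1 + log σ̂² + log 2π), so a decoder that reconstructs well automatically receives a smaller effective reconstruction weight.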


Similar articles

VAE Learning via Stein Variational Gradient Descent

A new method for learning variational autoencoders (VAEs) is developed, based on Stein variational gradient descent. A key advantage of this approach is that one need not make parametric assumptions about the form of the encoder distribution. Performance is further enhanced by integrating the proposed encoder with importance sampling. Excellent performance is demonstrated across multiple unsupe...
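The core Stein variational gradient descent update can be sketched for a one-dimensional standard normal target: each particle follows the kernel-weighted score of the target (attraction) plus the kernel gradient (repulsion). This is a generic illustration of SVGD, not the encoder-learning procedure of the cited paper; step size, bandwidth, and particle count are illustrative.

```python
import numpy as np

def svgd_step(x, grad_logp, h=1.0, eps=0.05):
    """One SVGD update with an RBF kernel k(a, b) = exp(-(a - b)^2 / (2h))."""
    diff = x[:, None] - x[None, :]        # diff[i, j] = x_i - x_j
    k = np.exp(-diff ** 2 / (2 * h))      # kernel matrix (symmetric)
    # phi_i = (1/n) * sum_j [ k(x_j, x_i) * grad_logp(x_j) + d/dx_j k(x_j, x_i) ]
    phi = (k @ grad_logp(x) + (diff / h * k).sum(axis=1)) / len(x)
    return x + eps * phi

rng = np.random.default_rng(0)
x = rng.normal(5.0, 0.5, size=50)         # particles start far from the target
score = lambda x: -x                      # grad log-density of N(0, 1)
for _ in range(1000):
    x = svgd_step(x, score)
```

After the updates, the particles approximate the target: their mean is driven toward 0 by the score term, while the repulsive kernel-gradient term keeps them spread out rather than collapsing to the mode.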


Inferences on the Generalized Variance under Normality

Generalized variance is used to measure dispersion in a multivariate population and is a well-established measure of the concentration of multivariate data. In this article, we consider constructing confidence intervals and testing hypotheses about the generalized variance in a multivariate normal distribution and give a computational approach. Simulation studies are performed to compare thi...
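Concretely, the generalized variance of a multivariate sample is the determinant of its covariance matrix. A quick numerical check with toy data (the data and values below are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
cov_true = np.array([[2.0, 0.5],
                     [0.5, 1.0]])   # population covariance
X = rng.multivariate_normal([0.0, 0.0], cov_true, size=5000)

S = np.cov(X, rowvar=False)         # sample covariance matrix
gen_var = np.linalg.det(S)          # generalized variance of the sample
# population value: det(cov_true) = 2*1 - 0.5*0.5 = 1.75
```

With a large sample, `gen_var` should lie close to the population determinant of 1.75.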


VAE with a VampPrior

Many different methods to train deep generative models have been introduced in the past. In this paper, we propose to extend the variational auto-encoder (VAE) framework with a new type of prior which we call "Variational Mixture of Posteriors" prior, or VampPrior for short. The VampPrior consists of a mixture distribution (e.g., a mixture of Gaussians) with components given by variational post...
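Evaluating a mixture-of-Gaussians prior of this kind reduces to a numerically stable log-mean-exp over the component densities. The sketch below assumes hypothetical pseudo-input posterior parameters `mu` and `log_var` (e.g. encoder outputs at K learned pseudo-inputs); it illustrates the density computation only, not the full VampPrior training setup.

```python
import numpy as np

def mixture_prior_logpdf(z, mu, log_var):
    """log p(z) for p(z) = (1/K) * sum_k N(z; mu_k, diag(sigma_k^2)).
    z: (d,); mu, log_var: (K, d) per-component Gaussian parameters."""
    log_comp = -0.5 * np.sum(
        (z - mu) ** 2 / np.exp(log_var) + log_var + np.log(2 * np.pi), axis=1)
    m = log_comp.max()                  # log-sum-exp trick for stability
    return m + np.log(np.mean(np.exp(log_comp - m)))

# sanity check: a single standard-normal component recovers log N(0; 0, 1)
lp = mixture_prior_logpdf(np.zeros(1), np.zeros((1, 1)), np.zeros((1, 1)))
```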


Generalized Analysis of Molecular Variance

Many studies in the fields of genetic epidemiology and applied population genetics are predicated on, or require, an assessment of the genetic background diversity of the individuals chosen for study. A number of strategies have been developed for assessing genetic background diversity. These strategies typically focus on genotype data collected on the individuals in the study, based on a panel...



Journal

Journal title: Neurocomputing

Year: 2022

ISSN: 0925-2312, 1872-8286

DOI: https://doi.org/10.1016/j.neucom.2022.08.067